Cascade neural networks with node-decoupled extended Kalman filtering
Abstract
Most neural networks used today rely on rigid, fixed-architecture networks and/or slow, gradient-descent-based training algorithms (e.g., backpropagation). In this paper, we propose a new neural network learning architecture to counter these problems. Namely, we combine (1) flexible cascade neural networks, which dynamically adjust the size of the network as part of the learning process, and (2) node-decoupled extended Kalman filtering (NDEKF), a fast-converging alternative to backpropagation. We first summarize how learning proceeds in cascade neural networks. We then show how NDEKF fits seamlessly into the cascade learning framework, and how cascade learning addresses the poor-local-minima problem of NDEKF reported in [1]. We analyze the computational complexity of our approach and compare it to fixed-architecture training paradigms. Finally, we report learning results for continuous function approximation and dynamic system identification; these results show substantial improvement in learning speed and error convergence over other neural network training methods.
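To make the NDEKF idea concrete, the following is a minimal sketch (not the authors' implementation) of a node-decoupled extended Kalman filter update for a single sigmoid node. Each node keeps its own weight vector and its own error-covariance matrix, so the Kalman update is applied per node rather than over the full network; the class name `NDEKFNode`, the initial covariance `p0`, and the measurement-noise term `r` are illustrative assumptions, not values from the paper.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class NDEKFNode:
    """Per-node EKF state: weights plus a decoupled covariance matrix.
    This is an illustrative sketch; p0 and r are assumed tuning values."""
    def __init__(self, n_inputs, p0=100.0):
        self.w = np.zeros(n_inputs + 1)        # incoming weights + bias
        self.P = np.eye(n_inputs + 1) * p0     # this node's error covariance

    def forward(self, x):
        a = np.append(x, 1.0)                  # augment input with bias term
        return sigmoid(self.w @ a), a

    def update(self, x, target, r=1.0):
        y, a = self.forward(x)
        e = target - y                         # innovation (output error)
        psi = y * (1.0 - y) * a                # d(output)/d(weights) for sigmoid
        Pp = self.P @ psi
        k = Pp / (r + psi @ Pp)                # Kalman gain for this node only
        self.w += k * e                        # weight update
        self.P -= np.outer(k, Pp)              # covariance update
        return e

# Toy usage: fit logical AND with one node, cycling over the four patterns.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([0.0, 0.0, 0.0, 1.0])
node = NDEKFNode(n_inputs=2)
for _ in range(50):
    for x, t in zip(X, T):
        node.update(x, t)
preds = [node.forward(x)[0] for x in X]
```

In a full cascade network, each hidden unit added during training would carry its own `(w, P)` pair and receive this same decoupled update, which is what keeps the cost linear in the number of nodes rather than quadratic in the total weight count.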
Similar papers
Sensorless Speed Control of Double Star Induction Machine With Five Level DTC Exploiting Neural Network and Extended Kalman Filter
This article presents a sensorless five-level DTC control scheme based on neural networks using an Extended Kalman Filter (EKF), applied to a Double Star Induction Machine (DSIM). The application of DTC control offers a very interesting solution to the problems of robustness and dynamics. However, this control has some drawbacks, such as the uncontrolled switching frequency and the strong ripple t...
Online Symbolic-Sequence Prediction with Recurrent Neural Networks
This paper studies the use of recurrent neural networks for predicting the next symbol in a sequence. The focus is on online prediction, a task much harder than classical offline grammatical inference with neural networks. Different kinds of sequence sources are considered: finite-state machines, chaotic sources, and texts in human language. Two algorithms are used for network training: real...
Improving Long-Term Online Prediction with Decoupled Extended Kalman Filters
Long Short-Term Memory (LSTM) recurrent neural networks (RNNs) outperform traditional RNNs when dealing with sequences involving not only short-term but also long-term dependencies. The decoupled extended Kalman filter learning algorithm (DEKF) works well in online environments and reduces significantly the number of training steps when compared to the standard gradient-descent algorithms. Prev...
On-Line Nonlinear Dynamic Data Reconciliation Using Extended Kalman Filtering: Application to a Distillation Column and a CSTR
Extended Kalman Filtering (EKF) is a nonlinear dynamic data reconciliation (NDDR) method. One of its main advantages is its suitability for on-line applications. This paper presents an on-line NDDR method using EKF. It is implemented for two case studies: temperature measurements of a distillation column and concentration measurements of a CSTR. In each time step, random numbers with zero m...
Online Symbolic-Sequence Prediction with Discrete-Time Recurrent Neural Networks
This paper studies the use of discrete-time recurrent neural networks for predicting the next symbol in a sequence. The focus is on online prediction, a task much harder than classical offline grammatical inference with neural networks. The results obtained show that the performance of recurrent networks working online is acceptable when sequences come from finite-state machines or even fro...